Interaction information

The interaction information (McGill, 1954), also called amount of information (Hu Kuo Ting, 1962) or co-information (Bell, 2003), is one of several generalizations of the mutual information. It expresses the amount of information (redundancy or synergy) bound up in a set of variables, ''beyond'' that which is present in any subset of those variables. Unlike the mutual information, the interaction information can be either positive or negative. This confusing property has likely retarded its wider adoption as an information measure in machine learning and cognitive science. These functions, their negativity and minima have a direct interpretation in algebraic topology (Baudot & Bennequin, 2015).
== The Three-Variable Case ==

For three variables X, Y, Z, the interaction information I(X;Y;Z) is given by
:\begin{align}
I(X;Y;Z) & = I(X;Y|Z) - I(X;Y) \\
         & = I(X;Z|Y) - I(X;Z) \\
         & = I(Y;Z|X) - I(Y;Z)
\end{align}

where, for example, I(X;Y) is the mutual information between variables X and Y, and I(X;Y|Z) is the conditional mutual information between variables X and Y given Z. Formally,
:\begin{align}
I(X;Y|Z) & = H(X|Z) + H(Y|Z) - H(X,Y|Z) \\
         & = H(X|Z) - H(X|Y,Z)
\end{align}

It thus follows that
:\begin{align}
I(X;Y;Z) = & - \big( H(X) + H(Y) + H(Z) \big) \\
           & + \big( H(X,Y) + H(X,Z) + H(Y,Z) \big) \\
           & - H(X,Y,Z)
\end{align}
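The entropy expansion above can be evaluated directly from a joint distribution. The sketch below is a minimal Python illustration (function and variable names are illustrative, not from any reference implementation), using a dict mapping outcome triples to probabilities:

```python
import math
from collections import defaultdict

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(joint, axes):
    """Marginalize a joint pmf {(x, y, z): p} onto the given coordinate indices."""
    out = defaultdict(float)
    for outcome, p in joint.items():
        out[tuple(outcome[i] for i in axes)] += p
    return dict(out)

def interaction_information(joint):
    """I(X;Y;Z) via the entropy expansion:
    -(H(X)+H(Y)+H(Z)) + (H(X,Y)+H(X,Z)+H(Y,Z)) - H(X,Y,Z)."""
    h = lambda axes: entropy(marginal(joint, axes))
    return (-(h((0,)) + h((1,)) + h((2,)))
            + (h((0, 1)) + h((0, 2)) + h((1, 2)))
            - entropy(joint))

# X and Y are independent fair coins and Z = X xor Y: every pair of
# variables is independent, yet any two determine the third.
xor = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}
print(interaction_information(xor))  # 1.0 (one bit of synergy)
```

Here every pairwise mutual information is zero, yet the triple carries one bit of purely synergistic information, so I(X;Y;Z) = +1 bit under the sign convention used above.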

For the three-variable case, the interaction information I(X;Y;Z) is the difference between the information shared by X and Y when Z has been fixed and when Z has not been fixed (see also Fano's 1961 textbook). Interaction information thus measures the influence of a variable Z on the amount of information shared between X and Y. Because the term I(X;Y|Z) can be zero (for example, when the dependency between X and Y is due entirely to the influence of a common cause Z), the interaction information can be negative as well as positive. Negative interaction information indicates that the variable Z inhibits (i.e., ''accounts for'' or ''explains'' some of) the correlation between X and Y, whereas positive interaction information indicates that Z facilitates or enhances the correlation between X and Y.
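Both signs can be checked numerically. The sketch below (helper names are hypothetical; it assumes the convention I(X;Y;Z) = I(X;Y|Z) - I(X;Y) used above) computes I(X;Y) and I(X;Y|Z) for two canonical joint distributions:

```python
from collections import defaultdict
from math import log2

def H(pmf):
    """Shannon entropy in bits of a pmf {outcome: probability}."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def project(joint, axes):
    """Marginal pmf of a joint {(x, y, z): p} on the chosen coordinates."""
    out = defaultdict(float)
    for key, p in joint.items():
        out[tuple(key[i] for i in axes)] += p
    return out

def mi_and_cmi(joint):
    """Return (I(X;Y), I(X;Y|Z)) for a joint pmf over (x, y, z)."""
    i_xy = (H(project(joint, (0,))) + H(project(joint, (1,)))
            - H(project(joint, (0, 1))))
    # I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
    i_xy_given_z = (H(project(joint, (0, 2))) + H(project(joint, (1, 2)))
                    - H(joint) - H(project(joint, (2,))))
    return i_xy, i_xy_given_z

# Common cause: Z is a fair coin and X = Y = Z.  Z fully explains the
# X-Y correlation, so I(X;Y|Z) = 0 < I(X;Y) = 1 and I(X;Y;Z) = -1.
common_cause = {(z, z, z): 0.5 for z in (0, 1)}
# Synergy: X, Y independent fair coins, Z = X xor Y.  Conditioning on Z
# couples X and Y, so I(X;Y|Z) = 1 > I(X;Y) = 0 and I(X;Y;Z) = +1.
xor = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}
```

The common-cause joint gives interaction information -1 bit (Z explains the correlation away); the XOR joint gives +1 bit (Z creates a dependency that was absent marginally).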
Interaction information is bounded. In the three-variable case it satisfies
:-\min\{ I(X;Y),\, I(X;Z),\, I(Y;Z) \} \leq I(X;Y;Z) \leq \min\{ I(X;Y|Z),\, I(X;Z|Y),\, I(Y;Z|X) \}
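These bounds follow directly from the three equivalent definitions (each conditional mutual information and each mutual information is non-negative). As a sanity check, the sketch below (illustrative code, not from any reference implementation) verifies them on random joint distributions over three binary variables:

```python
import random
from collections import defaultdict
from math import log2

def H(pmf):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marg(joint, axes):
    """Marginal pmf of a joint {(x, y, z): p} on the chosen coordinates."""
    out = defaultdict(float)
    for key, p in joint.items():
        out[tuple(key[i] for i in axes)] += p
    return out

def random_joint(rng, size=2):
    """A random joint pmf over {0..size-1}^3 via normalized random weights."""
    outcomes = [(x, y, z) for x in range(size)
                for y in range(size) for z in range(size)]
    w = [rng.random() for _ in outcomes]
    s = sum(w)
    return {o: wi / s for o, wi in zip(outcomes, w)}

rng = random.Random(0)
violations = 0
for _ in range(1000):
    j = random_joint(rng)
    hx, hy, hz = (H(marg(j, (i,))) for i in (0, 1, 2))
    hxy, hxz, hyz = (H(marg(j, a)) for a in ((0, 1), (0, 2), (1, 2)))
    hxyz = H(j)
    iii = -(hx + hy + hz) + (hxy + hxz + hyz) - hxyz
    lower = -min(hx + hy - hxy, hx + hz - hxz, hy + hz - hyz)
    upper = min(hxz + hyz - hxyz - hz,   # I(X;Y|Z)
                hxy + hyz - hxyz - hy,   # I(X;Z|Y)
                hxy + hxz - hxyz - hx)   # I(Y;Z|X)
    if not (lower - 1e-9 <= iii <= upper + 1e-9):
        violations += 1
print(violations)  # expect 0
```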


Excerpt source: Wikipedia, the free encyclopedia (English edition).
Read the full text of the article "Interaction information" on Wikipedia.


